Margin and Radius Based Multiple Kernel Learning

Authors

  • Huyen Do
  • Alexandros Kalousis
  • Adam Woznica
  • Melanie Hilario
Abstract

A serious drawback of kernel methods, and of Support Vector Machines (SVM) in particular, is the difficulty of choosing a suitable kernel function for a given dataset. One of the approaches proposed to address this problem is Multiple Kernel Learning (MKL), in which several kernels are combined adaptively for a given dataset. Many of the existing MKL methods use the SVM objective function and try to find a linear combination of basic kernels such that the separating margin between the classes is maximized. However, these methods ignore the fact that the theoretical error bound depends not only on the margin, but also on the radius of the smallest sphere that contains all the training instances. We present a novel MKL algorithm that optimizes the error bound by taking into account both the margin and the radius. The empirical results show that the proposed method compares favorably with other state-of-the-art MKL methods.
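To make the optimized quantity concrete, the sketch below evaluates the radius-margin product R² · ‖w‖² for a fixed convex combination of base kernels. This is a minimal illustration of the quantity such methods care about, not the authors' algorithm (which also optimizes the kernel weights); the names kernels, mu, meb_radius_sq, and radius_margin_objective are illustrative. The radius comes from the standard dual QP of the minimum enclosing ball, and the margin from an SVM trained on the precomputed combined kernel.

    import numpy as np
    from scipy.optimize import minimize
    from sklearn.svm import SVC

    def meb_radius_sq(K):
        """Squared radius R^2 of the minimum enclosing ball of the training
        points in feature space, via the dual QP:
            max_b  b @ diag(K) - b @ K @ b   s.t.  b >= 0, sum(b) = 1."""
        n = K.shape[0]
        d = np.diag(K)
        res = minimize(lambda b: b @ K @ b - b @ d,   # negated dual objective
                       np.full(n, 1.0 / n),
                       bounds=[(0.0, 1.0)] * n,
                       constraints=({'type': 'eq',
                                     'fun': lambda b: b.sum() - 1.0},),
                       method='SLSQP')
        return -res.fun

    def radius_margin_objective(kernels, mu, y, C=1.0):
        """R^2 * ||w||^2 for the combined kernel K = sum_m mu_m * K_m.
        An MKL method in this family would minimize this over the weights mu."""
        K = sum(m * Km for m, Km in zip(mu, kernels))
        clf = SVC(kernel='precomputed', C=C).fit(K, y)
        sv = clf.support_
        # dual_coef_ holds y_i * alpha_i, so this quadratic form equals ||w||^2
        w_sq = (clf.dual_coef_ @ K[np.ix_(sv, sv)] @ clf.dual_coef_.T).item()
        return meb_radius_sq(K) * w_sq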


Similar resources

Learning Kernels with Radiuses of Minimum Enclosing Balls

In this paper, we point out that there exist scaling and initialization problems in most existing multiple kernel learning (MKL) approaches, which employ the large-margin principle to jointly learn both a kernel and an SVM classifier. The reason is that the margin itself cannot adequately describe how good a kernel is, because it neglects scaling. We use the ratio between the margin and the ...
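The scaling problem is easy to see numerically. Reusing meb_radius_sq and the imports from the sketch above, the following toy check on hypothetical data shows that rescaling a kernel K → c·K inflates the (near-hard) margin arbitrarily, while the ratio R² · ‖w‖² stays essentially constant:

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 5))
    y = np.where(X[:, 0] > 0, 1, -1)          # linearly separable labels
    K = X @ X.T                               # linear kernel
    for c in (1.0, 4.0, 100.0):
        clf = SVC(kernel='precomputed', C=1e6).fit(c * K, y)  # ~hard margin
        sv = clf.support_
        w_sq = (clf.dual_coef_ @ (c * K)[np.ix_(sv, sv)]
                @ clf.dual_coef_.T).item()
        print(f"scale={c:6.1f}  margin={1 / np.sqrt(w_sq):.4f}  "
              f"R^2*||w||^2={meb_radius_sq(c * K) * w_sq:.4f}")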


F-SVM: Combination of Feature Transformation and SVM Learning via Convex Relaxation

The generalization error bound of the support vector machine (SVM) depends on the ratio between the radius and the margin, while the standard SVM considers only the maximization of the margin and ignores the minimization of the radius. Several approaches have been proposed to integrate radius and margin for joint learning of a feature transformation and an SVM classifier. However, most of them either require the form of...
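For reference, one standard form of the bound this abstract refers to is Vapnik's radius-margin bound for the hard-margin SVM (which variant each paper uses is an assumption here):

$$ \mathbb{E}[\mathrm{err}] \;\le\; \frac{1}{n}\,\mathbb{E}\!\left[\frac{R^2}{\gamma^2}\right] \;=\; \frac{1}{n}\,\mathbb{E}\!\left[R^2\,\lVert w\rVert^2\right], $$

where R is the radius of the smallest sphere enclosing the training points in feature space and γ = 1/‖w‖ is the margin. Maximizing the margin alone is therefore justified only when R is (approximately) fixed, which holds for a single given kernel but not when the kernel or feature transformation is itself being learned.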


Learning Kernel Parameters by using Class Separability Measure

Learning kernel parameters is important for kernel-based methods because these parameters have a significant impact on the generalization ability of such methods. Besides cross-validation and leave-one-out, minimizing an upper bound on the generalization error, such as the radius-margin bound, has also been proposed as a more efficient way to learn the optimal kernel parameters. In this ...
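The preview cuts off before the paper's own measure, so as a hedged sketch of what a class separability criterion in kernel space can look like, the ratio of between-class to within-class scatter is computable from the kernel matrix alone:

    import numpy as np

    def class_separability(K, y):
        """Between/within-class scatter ratio in feature space, from the
        kernel matrix K alone (no explicit feature map needed):
          tr(S_w) = sum_c [ sum_{i in c} K_ii - (1/n_c) sum_{i,j in c} K_ij ]
          tr(S_t) = trace(K) - (1/n) sum_{i,j} K_ij,  tr(S_b) = tr(S_t) - tr(S_w)
        """
        n = K.shape[0]
        tr_st = np.trace(K) - K.sum() / n       # total scatter
        tr_sw = 0.0                             # within-class scatter
        for c in np.unique(y):
            idx = np.flatnonzero(y == c)
            Kc = K[np.ix_(idx, idx)]
            tr_sw += np.trace(Kc) - Kc.sum() / len(idx)
        return (tr_st - tr_sw) / tr_sw          # larger = more separable

    # Hypothetical usage: pick an RBF width by maximizing separability,
    # where rbf(X, g) is an assumed helper returning the kernel matrix:
    #   best_gamma = max(gammas, key=lambda g: class_separability(rbf(X, g), y))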


Learning Kernels via Margin-and-Radius Ratios

Despite the great success of SVM, it is usually difficult for users to select suitable kernels for SVM classifiers. Kernel learning has been developed to jointly learn both a kernel and an SVM classifier [1]. Most existing kernel learning approaches, e.g., [2, 3, 4], employ the margin-based formulation, equivalent to:

$$ \min_{k,\,w,\,b,\,\xi}\ \frac{1}{2}\lVert w\rVert^2 + C\sum_i \xi_i, \quad \text{s.t.}\quad y_i\big(\langle \phi(x_i;k),\, w\rangle + b\big) \ge 1 - \xi_i,\quad \xi_i \ge 0, \qquad (1) $$

...
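A consequence of (1) is that rescaling the kernel, k → c·k, shrinks ‖w‖² without making the classifier any better, so the formulation spuriously favors scaled-up kernels. The ratio-based papers above fix this by making the objective scale-invariant; one representative form (the exact formulation varies by paper, so treat this as a sketch) multiplies the regularizer by the squared MEB radius R²(k):

$$ \min_{k,\,w,\,b,\,\xi}\ \frac{1}{2}\,R^2(k)\,\lVert w\rVert^2 + C\sum_i \xi_i, \quad \text{s.t.}\quad y_i\big(\langle \phi(x_i;k),\, w\rangle + b\big) \ge 1 - \xi_i,\quad \xi_i \ge 0, $$

which works because R²(k)‖w‖² is invariant under k → c·k (the radius picks up the factor c exactly as ‖w‖² loses it).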


Ratio-Based Multiple Kernel Clustering

Maximum margin clustering (MMC) approaches extend the large-margin principle of SVM to unsupervised learning, with considerable success. In this work, we utilize the ratio between the margin and the intra-cluster variance to explicitly consider both the separation and the compactness of the clusters in the objective. Moreover, we employ multiple kernel learning (MKL) to jointly learn the kernel...
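The compactness term of that ratio is straightforward to compute from a kernel matrix once cluster assignments are fixed; the margin term requires actually solving the clustering problem, so this hedged sketch shows only the denominator:

    import numpy as np

    def intra_cluster_variance(K, labels):
        """Mean squared feature-space distance of each point to its cluster
        centroid, using only the kernel matrix (centroids stay implicit):
          (1/n) * sum_c [ sum_{i in c} K_ii - (1/n_c) sum_{i,j in c} K_ij ]"""
        total = 0.0
        for c in np.unique(labels):
            idx = np.flatnonzero(labels == c)
            Kc = K[np.ix_(idx, idx)]
            total += np.trace(Kc) - Kc.sum() / len(idx)
        return total / K.shape[0]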




Publication date: 2009